See Spot spy? A new generation of police robots faces backlash

A woman holds up a sign outside San Francisco City Hall at a demonstration against the use of robots by the city’s police department.
(Jeff Chiu / Associated Press)

Spot isn’t like other police dogs.

For starters, it has no head. Or fur. And instead of kibble and water, it runs on a lithium-ion battery.

When the four-legged robot, which can climb stairs, open doors and transmit 360-degree video, was unveiled a few years ago, it was billed as a potent new tool for industries whose workers are often in dangerous conditions. It could, for example, detect radiation for an energy company or inspect the safety of a mining tunnel, its creator, Boston Dynamics, said in promotional material.

And police officials around the U.S. realized that Spot, as its inventors named it, also offered an upgrade from the slower, less agile robots currently used in high-risk situations such as negotiating with hostage takers and assessing suspicious packages.

The Los Angeles Police Department decided it needed to have a Spot. It turned to the L.A. Police Foundation, which raises money for the department, to cover the nearly $280,000 price tag that included upgrades and warranties.

But the LAPD’s expected purchase has been met with opposition from critics who worry the technology represents a dangerous new frontier in policing, as law enforcement in Los Angeles and elsewhere looks to incorporate smarter, more capable robots into the job. Opponents of the technology have mounted campaigns against the use of robots, drones and other automated devices by police, saying they are a threat to people’s privacy and safety.

That debate played out recently in San Francisco, where public backlash defeated a plan that would have allowed the city’s police to use weaponized robots to kill people in certain situations.

“Piecemeal efforts” by local officials to regulate police use of such technology have largely failed to keep pace with a rapidly evolving field of robotics, said Elizabeth Joh, a UC Davis law professor.

“Most of the way we think about how the law regulates police assumes a human being making human decisions in a face-to-face encounter with the public,” said Joh, who specializes in policing, privacy and technology. “But the more and more we use this technology, there is increasing reliance by police on machine-made decisions.”

Standing almost 28 inches tall and weighing 70 pounds, Spot is roughly the size of a full-grown Dalmatian. Equipped with 360-degree cameras, the robot collects and processes information about its surroundings, which is transmitted in real time to an officer controlling its movements with a tablet-like device. It can be customized with a mechanical arm to open doors, or with sophisticated sensors capable of detecting chemical spills and creating a three-dimensional map of an area.

In announcing the plan to purchase the robotic dog last month, LAPD Chief Michel Moore said the device would serve only as officers’ eyes and ears in a “narrow set” of dangerous situations, and not be used in everyday patrols or for surveillance of crime suspects.

Its use would need to be approved by one of the department’s deputy chiefs, and the chief would have to be notified.

The robot could help officers gather information during operations typically handled by the department’s SWAT teams, such as an active shooter, barricaded suspect or explosive device, without endangering lives, said Capt. Brian Bixler, who oversees the LAPD’s Metropolitan Division, which includes SWAT.

In a scenario involving a holed-up suspect, for instance, police might send in the robot to see which room the person is in and whether they are armed, and to communicate with the person through a built-in intercom system in hopes of resolving the situation without using force, he said.

“We would much rather put a remote system in to see what’s going on, rather than sending a human or live canine in,” Bixler said.

It might also be deployed during a natural disaster or a hazardous material spill to provide information to first responders, he said.

With its ability to maneuver over or around obstacles and its semiautonomous operating system, Bixler said, Spot would be more useful in these types of situations than the 11 wheeled or tracked robots and five drones the department currently uses.

Speaking to the Police Commission last month, Moore said the department had learned from the upheaval that followed the New York City Police Department’s use of the same device last year. The nation’s largest police force acquired Spot in 2020 (and renamed it Digidog), but its use didn’t get widespread attention until the following year when a video showed the robot trotting alongside NYPD officers during a hostage situation at a high-rise public housing building. The public outcry grew a few months later when police deployed it at another public housing building as they apprehended an armed suspect.

The robot’s appearances tapped into deep-seated distrust of police among poor communities of color. Critics denounced it as a waste of resources and a high-tech surveillance tool police would misuse on Black and Latino people.

“Now robotic surveillance ground drones are being deployed for testing on low-income communities of color with underresourced schools,” Rep. Alexandria Ocasio-Cortez (D-N.Y.) tweeted at the time.

The city’s Police Department abruptly broke off its contract with Boston Dynamics and returned the robot.

Similar protests broke out in San Francisco, where police sought permission to use robots armed with weapons in limited circumstances.

The request was apparently made in light of a 2016 rampage in Dallas in which a gunman killed five police officers, injured others and led authorities on a chaotic, hours-long hunt.

In an unprecedented move for American law enforcement, police detonated a bomb on a robot to kill the gunman, who was cornered in a garage and refused to surrender.

The proposed San Francisco ordinance was introduced with promises of narrow use and restraint. It would allow some senior police officials to authorize the use of robots “as a deadly force option” in limited, violent situations when less extreme measures have failed.

Shamann Walton, the Board of Supervisors’ president, said he hopes the public continues to speak out. Though police won’t be allowed to arm robots for now, the issue has been sent to a committee for further discussion, and the supervisors could vote on it again.

“My hope is ... that we listen to the dangers and the harms that the community feels would come from this policy, and that we abandon it and get to focusing on making San Francisco more affordable, keeping people safe,” Walton said, echoing some of the complaints raised by activists in New York and Los Angeles.

In a statement to The Times, San Francisco Police Chief Bill Scott said public discourse surrounding the issue of arming robots “has become distorted.”

“We want to use our robots to save lives — not take them,” Scott said. “To be sure, this is about neutralizing a threat by equipping a robot with a lethal option as a last-case scenario, not sending an officer in on a suicide mission.”

A similar proposal in neighboring Oakland was rejected, and voters there will soon decide whether police robots should be equipped with pepper spray.

Geoff Alpert, a University of South Carolina criminology professor and an expert on police use of force, said the central question surrounding robotics in policing is not whether they should be used, but how the people overseeing police departments should shape policies to guide their use.

“No one is saying you give the machine the authority to make a decision” whether to use deadly force, Alpert said. “We’re just saying the decision has been made, now let’s apply it with a machine as opposed to a person.”

Others believe the case for adopting new technologies isn’t so clear-cut.

Joanna Schwartz, a UCLA law professor and expert on police misconduct litigation, said increases in law enforcement power are often justified at first by extreme circumstances, such as the 2016 rampage in Dallas, but can become a slippery slope toward more common use.

“Those sort of horror stories, or worst examples, can and have opened the door for much more use of that power beyond the most horrific situation,” Schwartz said. “Historically, when we grant police power or discretion or advanced technologies, they tend to be used in many more situations.”

The issue has in recent months gotten some attention from state lawmakers. Last week, Assemblywoman Akilah Weber (D-San Diego) announced that she had introduced a bill to “regulate, limit and require the reporting of the use of deadly force by a law enforcement agency by means of remotely operated equipment.”

“With several cities in California considering policies to govern law enforcement’s use of deadly force by remotely-operated equipment, it is time to begin discussions about these devices,” she tweeted.

For the record:

10:32 a.m. Dec. 21, 2022: An earlier version of this story said the robot appeared on “Jimmy Kimmel Live!” It appeared on “The Tonight Show Starring Jimmy Fallon.”

Outside the controversy in policing, Spot has received benign attention, appearing in viral social media videos dancing to pop songs. Earlier this year, Jimmy Fallon featured the robot on an episode of “The Tonight Show,” and Boston Dynamics has highlighted the technology’s use in countries worldwide. For example, London’s Heathrow Airport is using one of the robots to do 3-D laser scans of a 1960s-era cargo tunnel that’s being refurbished. In Ukraine, it’s being used to sweep for mines.

In a brief statement to The Times, the company said that Spot’s use “in public safety applications” was to “keep people out of harm’s way and help first responders assess hazardous situations.” It cited the robot’s use by several law enforcement agencies in the U.S. and elsewhere, including Houston, where Spot was used in the apprehension of a barricaded murder suspect, and the Netherlands, where it’s been deployed to sweep suspected drug labs before officers are sent in.

And in an open letter signed earlier this year by Boston Dynamics and a handful of other robotics companies, the group pledged not to “weaponize our advanced-mobility general-purpose robots or the software we develop that enables advanced robotics and we will not support others to do so.”

“The emergence of advanced mobile robots offers the possibility of misuse. Untrustworthy people could use them to invade civil rights or to threaten, harm, or intimidate others,” the letter read.

Another company, Ghost Robotics, has started marketing a weaponized dog-like robot to several branches of the U.S. military and its allies. The Philadelphia company did not respond to an email seeking comment.

Hamid Khan, a member of the Stop LAPD Spying Coalition, sees the LAPD’s interest in Spot as part of a broader push by police to fashion themselves after the military with increasingly high-tech tools.

Khan said his group is opposed to civilian oversight boards that the LAPD and other police departments have proposed to oversee police technology. Such bodies, he argued, give police political cover for expanding their surveillance capabilities. Instead, he said, the coalition and other groups support an outright ban on police using robotics and other surveillance technology.

While most law enforcement agencies are currently using Spot and similar robots only for reconnaissance in crisis situations, Carolin Kemper, a researcher at the German Research Institute for Public Administration, said the rush to acquire the technology without proper safeguards opens the door for future abuses.

With public trust in law enforcement at historic lows, deploying a robot like Spot only reinforces the image of the police as a “faceless” government entity disconnected from the communities it patrols, Kemper and her research partner Michael Kolain said in a Skype interview.

Kolain also questioned how police might use images, words and other information collected by robots.

“It could become a really scary tool of surveillance because from the outside you don’t know what’s going on,” he said. “If I talk back, will my voice be recorded? All these things we don’t know.”

The Associated Press contributed to this report.